
    Arrow Contraction and Expansion in Tropical Diagrams

    Arrow contraction applied to a tropical diagram of probability spaces is a modification of the diagram that replaces one of its morphisms by an isomorphism while preserving the other parts of the diagram. It is related to the rate regions introduced by Ahlswede and Körner. In a companion article we use arrow contraction to derive information about the shape of the entropic cone. Arrow expansion is the inverse operation to arrow contraction.
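    As a rough illustration only (a minimal sketch of ours, not notation taken from the article), consider the smallest "fan" diagram, in which one space reduces to two others; contracting one arrow replaces it by an isomorphism while the other leg of the fan is preserved:

```latex
% Assumes amsmath and amssymb in the preamble.
% A minimal "fan" of probability spaces: Z reduces to both X and Y.
% Contracting the arrow f replaces it by an isomorphism, so the
% modified space Z' is identified with X while the reduction to Y
% is preserved (hypothetical labels f, g, Z' for illustration).
\[
  \bigl( X \xleftarrow{\;f\;} Z \xrightarrow{\;g\;} Y \bigr)
  \;\rightsquigarrow\;
  \bigl( X \xleftarrow{\;\cong\;} Z' \xrightarrow{\;g'\;} Y \bigr)
\]
```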

    Tropical probability theory and an application to the entropic cone

    In a series of articles, we have been developing a theory of tropical diagrams of probability spaces, expecting it to be useful for information-optimization problems in information theory and artificial intelligence. In this article, we give a summary of our work so far and apply the theory to derive a dimension-reduction statement about the shape of the entropic cone.

    Tropical Limits of Probability Spaces, Part I: The Intrinsic Kolmogorov-Sinai Distance and the Asymptotic Equipartition Property for Configurations

    The entropy of a finite probability space $X$ measures the observable cardinality of large independent products $X^{\otimes n}$ of the probability space. If two probability spaces $X$ and $Y$ have the same entropy, there is an almost measure-preserving bijection between large parts of $X^{\otimes n}$ and $Y^{\otimes n}$. In this way, $X$ and $Y$ are asymptotically equivalent. It turns out to be challenging to generalize this notion of asymptotic equivalence to configurations of probability spaces, which are collections of probability spaces with measure-preserving maps between some of them. In this article we introduce the intrinsic Kolmogorov-Sinai distance on the space of configurations of probability spaces. Concentrating on the large-scale geometry, we pass to the asymptotic Kolmogorov-Sinai distance. It induces an asymptotic equivalence relation on sequences of configurations of probability spaces; we call the equivalence classes tropical probability spaces. In this context we prove an Asymptotic Equipartition Property for configurations: tropical configurations can always be approximated by homogeneous configurations. In addition, we show that the solutions to certain information-optimization problems are Lipschitz-continuous with respect to the asymptotic Kolmogorov-Sinai distance. It follows from these two statements that, in order to solve an information-optimization problem, it suffices to consider homogeneous configurations. Finally, we show that spaces of trajectories of length $n$ of certain stochastic processes, in particular stationary Markov chains, have a tropical limit.
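    The "observable cardinality" in the opening sentence is the usual asymptotic equipartition statement for a single space. In the abstract's notation, this is the standard textbook formulation (not quoted from the article itself):

```latex
% Assumes amsmath in the preamble.
% Shannon entropy of a finite probability space X = (S, p):
\[
  H(X) \;=\; -\sum_{x \in S} p(x)\,\log_2 p(x).
\]
% AEP: for large n the product X^{\otimes n} concentrates on a set of
% roughly 2^{nH(X)} outcomes of nearly equal probability, so two spaces
% with equal entropy have typical sets of matching size.
\[
  \bigl|\,\text{typical set of } X^{\otimes n}\,\bigr| \;\approx\; 2^{\,n H(X)}.
\]
```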

    Conditioning in tropical probability theory

    We define a natural operation of conditioning of tropical diagrams of probability spaces and show that it is Lipschitz continuous with respect to the asymptotic entropy distance.

    Well-posedness of a parabolic moving-boundary problem in the setting of Wasserstein gradient flows

    We develop a gradient-flow framework based on the Wasserstein metric for a parabolic moving-boundary problem that models crystal dissolution and precipitation. In doing so, we derive a new weak formulation for this moving-boundary problem and we show that this formulation is well-posed. In addition, we develop a new uniqueness technique based on the framework of gradient flows with respect to the Wasserstein metric. With this uniqueness technique, the Wasserstein framework becomes a complete well-posedness setting for this parabolic moving-boundary problem.
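    For context, gradient flows in the Wasserstein metric are typically realized through the minimizing-movement (JKO) time discretization sketched below. The driving functional $E$ is left abstract here, since the specific energy for the crystal dissolution and precipitation model is not reproduced in this abstract:

```latex
% Assumes amsmath. W_2 denotes the quadratic Wasserstein distance,
% tau > 0 the time step, and E a driving free-energy functional
% (placeholder; the paper's concrete functional is not given here).
\[
  \rho_{k+1} \;\in\; \operatorname*{arg\,min}_{\rho}
  \left\{ \frac{1}{2\tau}\, W_2^2(\rho, \rho_k) \;+\; E(\rho) \right\},
  \qquad k = 0, 1, 2, \dots
\]
```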

    Diffusion Variational Autoencoders

    A standard Variational Autoencoder, with a Euclidean latent space, is structurally incapable of capturing topological properties of certain datasets. To remove topological obstructions, we introduce Diffusion Variational Autoencoders with arbitrary manifolds as a latent space. A Diffusion Variational Autoencoder uses transition kernels of Brownian motion on the manifold. In particular, it uses properties of the Brownian motion to implement the reparametrization trick and fast approximations to the KL divergence. We show that the Diffusion Variational Autoencoder is capable of capturing topological properties of synthetic datasets. Additionally, we train on MNIST with spheres, tori, projective spaces, SO(3), and a torus embedded in $\mathbb{R}^3$ as latent spaces. Although a natural dataset like MNIST does not have latent variables with a clear-cut topological structure, training it on a manifold can still highlight topological and geometrical properties.
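    To make the reparametrization idea concrete, here is a small NumPy sketch of our own (not the authors' code): it approximates one Brownian-motion step on the unit sphere by a Gaussian increment in the tangent space, scaled by the diffusion time, followed by projection back onto the sphere. The function name and the step-size parameter `t` are illustrative assumptions.

```python
import numpy as np

def brownian_step_on_sphere(z, t, rng):
    """One approximate Brownian-motion step on the unit sphere.

    z   : (batch, d) array of unit-norm points (current latent codes)
    t   : diffusion time of the step (variance scale)
    rng : numpy random Generator

    A random-walk approximation, not the exact heat-kernel sampler:
    draw ambient Gaussian noise, project it onto the tangent plane at z,
    take a sqrt(t)-scaled step, and re-normalize onto the sphere.
    """
    eps = rng.standard_normal(z.shape)                                  # ambient Gaussian noise
    eps_tangent = eps - np.sum(eps * z, axis=-1, keepdims=True) * z     # project to tangent plane
    step = z + np.sqrt(t) * eps_tangent                                 # Euler step in ambient space
    return step / np.linalg.norm(step, axis=-1, keepdims=True)          # retract to the sphere

# Example: a batch of 4 latent means on S^2 with diffusion time 0.01.
rng = np.random.default_rng(0)
mu = rng.standard_normal((4, 3))
mu /= np.linalg.norm(mu, axis=-1, keepdims=True)
z = brownian_step_on_sphere(mu, t=0.01, rng=rng)
print(np.linalg.norm(z, axis=-1))   # all ~1.0: samples stay on the sphere
```

    Because the sample is a smooth function of the mean `mu` and the noise `eps`, gradients can flow through the step, which is what the reparametrization trick requires.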

    Semicontinuity of capacity under pointed intrinsic flat convergence

    The concept of the capacity of a compact set in $\mathbb{R}^n$ generalizes readily to noncompact Riemannian manifolds and, with more substantial work, to metric spaces (where multiple natural definitions of capacity are possible). Motivated by analytic and geometric considerations, and in particular Jauregui's definition of capacity-volume mass and Jauregui and Lee's results on the lower semicontinuity of the ADM mass and Huisken's isoperimetric mass, we investigate how the capacity functional behaves when the background spaces vary. Specifically, we allow the background spaces to consist of a sequence of local integral current spaces converging in the pointed Sormani--Wenger intrinsic flat sense. For the case of volume-preserving ($\mathcal{VF}$) convergence, we prove two theorems that demonstrate an upper semicontinuity phenomenon for the capacity: one version is for balls of a fixed radius centered about converging points; the other is for Lipschitz sublevel sets. Our approach is motivated by Portegies' investigation of the semicontinuity of eigenvalues under $\mathcal{VF}$ convergence. We include examples to show the semicontinuity may be strict, and that the volume-preserving hypothesis is necessary. Finally, there is a discussion on how capacity and our results may be used towards understanding the general relativistic total mass in non-smooth settings.
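    For reference, the classical capacity of a compact set $K \subset \mathbb{R}^n$ that the abstract starts from can be written as the following variational problem (a standard definition, stated up to a normalizing constant that varies by convention; the metric-space versions mentioned above modify the class of admissible functions):

```latex
% Assumes amsmath in the preamble.
% 2-capacity of a compact set K in R^n, up to normalization:
% infimum of the Dirichlet energy over smooth compactly supported
% test functions that are at least 1 on K.
\[
  \operatorname{cap}(K)
  \;=\;
  \inf\left\{ \int_{\mathbb{R}^n} |\nabla u|^2 \, dx
  \;:\; u \in C_c^\infty(\mathbb{R}^n),\; u \ge 1 \text{ on } K \right\}.
\]
```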